Iteration Complexity of Feasible Descent Methods for Convex Optimization

Authors

  • Po-Wei Wang
  • Chih-Jen Lin
Abstract

In many machine learning problems, such as the dual form of SVM, the objective function to be minimized is convex but not strongly convex. This fact causes difficulties in obtaining the complexity of some commonly used optimization algorithms. In this paper, we prove global linear convergence for a wide range of algorithms when they are applied to some non-strongly convex problems. In particular, we are the first to prove O(log(1/ε)) time complexity of cyclic coordinate descent methods on dual problems of support vector classification and regression.
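To make the analyzed setting concrete, here is a minimal sketch of cyclic (dual) coordinate descent for the bias-free L1-loss SVM dual, min_α ½αᵀQα − eᵀα subject to 0 ≤ α_i ≤ C with Q_ij = y_i y_j x_iᵀx_j. This is a sketch under those assumptions, not the authors' implementation; the epoch count is an illustrative choice.

```python
import numpy as np

def dual_cd_svm(X, y, C=1.0, epochs=50):
    """Cyclic coordinate descent on the bias-free L1-SVM dual:
        min_a 0.5 * a^T Q a - e^T a,  s.t. 0 <= a_i <= C,
    with Q_ij = y_i * y_j * x_i^T x_j.  The primal vector
    w = sum_i a_i * y_i * x_i is maintained so that each partial
    gradient costs O(d)."""
    n, d = X.shape
    alpha, w = np.zeros(n), np.zeros(d)
    Qii = (X ** 2).sum(axis=1)                # Q_ii = ||x_i||^2 since y_i^2 = 1
    for _ in range(epochs):
        for i in range(n):                    # one cyclic pass = one outer iteration
            if Qii[i] == 0.0:
                continue
            grad_i = y[i] * (w @ X[i]) - 1.0  # partial gradient (Q a)_i - 1
            a_new = min(max(alpha[i] - grad_i / Qii[i], 0.0), C)
            w += (a_new - alpha[i]) * y[i] * X[i]
            alpha[i] = a_new
    return w, alpha
```

Each inner step is the exact coordinate minimizer projected onto [0, C], so every pass is a feasible descent step in the sense studied in the paper.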


Similar articles


A family of subgradient-based methods for convex optimization problems in a unifying framework

We propose a new family of subgradient- and gradient-based methods which converge with optimal complexity for convex optimization problems whose feasible region is simple enough. This includes cases where the objective function is non-smooth, smooth, has composite/saddle structure, or is given by an inexact oracle model. We unify the way of constructing the subproblems which are necessary to...
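As background for this family of methods, the following is a textbook projected subgradient sketch for a simple feasible region, not the cited unifying framework; `f`, `subgrad`, and `project` are hypothetical user-supplied callables.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, steps=1000):
    """Classical projected subgradient method:
        x_{k+1} = P(x_k - t_k * g_k),  g_k a subgradient of f at x_k,
    with diminishing steps t_k = 1/sqrt(k).  Subgradient steps need
    not decrease f, so the best iterate seen so far is returned."""
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(1, steps + 1):
        x = project(x - subgrad(x) / np.sqrt(k))
        fx = f(x)
        if fx < f_best:
            x_best, f_best = x.copy(), fx
    return x_best
```

The diminishing step size gives the standard O(1/sqrt(k)) rate for non-smooth convex f; smooth or composite structure, as in the cited family, permits faster rates.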


Efficient parallel coordinate descent algorithm for convex optimization problems with separable constraints: application to distributed MPC

In this paper we propose a parallel coordinate descent algorithm for solving smooth convex optimization problems with separable constraints that may arise, e.g., in distributed model predictive control (MPC) for linear network systems. Our algorithm is based on block coordinate descent updates in parallel and has a very simple iteration. We prove a (sub)linear rate of convergence for the new algorithm...
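A Jacobi-style sketch of one such parallel step is shown below, assuming a per-block Lipschitz constant `L[b]` and a cheap per-block projection; the helper names are hypothetical and the cited paper's exact step-size conditions may differ.

```python
import numpy as np

def parallel_block_cd_step(grad, project_block, x, blocks, L):
    """One parallel (Jacobi-style) coordinate descent step for
        min f(x)  s.t.  x_b in X_b for every block b,
    where the constraints are separable across blocks.  Every block
    takes a projected gradient step from the SAME iterate x, so the
    per-block updates are independent and can run in parallel."""
    g = grad(x)
    x_new = x.copy()
    for b, idx in enumerate(blocks):          # embarrassingly parallel loop
        step = x[idx] - g[idx] / L[b]         # block gradient step
        x_new[idx] = project_block(b, step)   # projection onto X_b
    return x_new
```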


On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization

The cyclic block coordinate descent-type (CBCD-type) methods, which perform iterative updates for a few coordinates (a block) simultaneously throughout the procedure, have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression...
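To illustrate the CBCD scheme on a strongly convex instance, here is a minimal sketch, assuming ridge regression as the example problem, that minimizes exactly over one block per inner step; the block size and epoch count are illustrative assumptions.

```python
import numpy as np

def cbcd_ridge(A, b, lam=1.0, block_size=10, epochs=20):
    """Cyclic block coordinate descent for ridge regression,
        min_x 0.5 * ||A x - b||^2 + 0.5 * lam * ||x||^2,
    solving each block subproblem exactly.  The residual r = b - A x
    is maintained so a block update costs O(n * |block|) plus one
    small |block| x |block| linear solve."""
    n, d = A.shape
    x = np.zeros(d)
    r = b - A @ x
    blocks = [np.arange(i, min(i + block_size, d))
              for i in range(0, d, block_size)]
    for _ in range(epochs):
        for idx in blocks:                     # cyclic pass over blocks
            Ab = A[:, idx]
            r_full = r + Ab @ x[idx]           # residual with this block removed
            H = Ab.T @ Ab + lam * np.eye(len(idx))
            x_blk = np.linalg.solve(H, Ab.T @ r_full)
            r = r_full - Ab @ x_blk
            x[idx] = x_blk
    return x
```

Maintaining the residual is the usual reason block updates stay cheap in practice: only the columns of the active block are touched per inner step.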


An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization

The cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD-type methods...




Publication date: 2013